Local Gradient Descent Methods for GMM Simplification
Abstract
Gaussian mixture model simplification is a powerful technique for reducing the number of components of an existing mixture model without having to re-cluster the original data set. Instead, a simplified GMM with fewer components is computed by minimizing some distance metric between the two models. In this paper, we derive an analytical expression for the difference between the probability density functions of two GMMs along with its gradient information. We minimize the objective function using gradient descent and K-means. Both synthetic and non-synthetic test cases are used in the experiments.
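The idea can be illustrated with a minimal sketch: for 1-D GMMs the squared L2 distance between the two densities has a closed form built from pairwise Gaussian overlap integrals, and a simplified model can be fit by descending on that objective. The helper names (`gauss_overlap`, `l2_distance`), the 1-D restriction, the hyperparameters, and the use of numerical gradients (the paper derives analytic ones) are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gauss_overlap(m1, v1, m2, v2):
    # Closed-form integral of a product of two 1-D Gaussians:
    # ∫ N(x; m1, v1) N(x; m2, v2) dx = N(m1; m2, v1 + v2)
    v = v1 + v2
    return np.exp(-0.5 * (m1 - m2) ** 2 / v) / np.sqrt(2 * np.pi * v)

def l2_distance(w1, m1, v1, w2, m2, v2):
    # Squared L2 distance between two 1-D GMM densities f and g:
    # ||f - g||^2 = ∫f^2 - 2∫fg + ∫g^2, each term a double sum of overlaps.
    def cross(wa, ma, va, wb, mb, vb):
        return sum(wa[i] * wb[j] * gauss_overlap(ma[i], va[i], mb[j], vb[j])
                   for i in range(len(wa)) for j in range(len(wb)))
    return (cross(w1, m1, v1, w1, m1, v1)
            - 2 * cross(w1, m1, v1, w2, m2, v2)
            + cross(w2, m2, v2, w2, m2, v2))

# Reference model: 3 components; simplified model: a single component.
w_ref = np.array([0.3, 0.4, 0.3])
m_ref = np.array([-1.0, 0.0, 1.0])
v_ref = np.array([0.5, 0.5, 0.5])

def obj(t):
    # Objective: L2 distance from the reference GMM to a 1-component model
    # parameterized by t = [mean, variance].
    return l2_distance(w_ref, m_ref, v_ref, np.array([1.0]),
                       np.array([t[0]]), np.array([t[1]]))

# Plain gradient descent with central-difference gradients (for brevity).
theta = np.array([0.5, 2.0])
lr, eps = 0.5, 1e-5
for _ in range(200):
    g = np.array([(obj(theta + eps * e) - obj(theta - eps * e)) / (2 * eps)
                  for e in np.eye(2)])
    theta -= lr * g
    theta[1] = max(theta[1], 1e-3)  # keep the variance positive
```

Because the reference mixture is symmetric about zero, the fitted mean drifts toward 0; the same double-sum structure extends to the multivariate case with matrix-valued overlap terms.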
Similar resources
Sliced Wasserstein Distance for Learning Gaussian Mixture Models
Gaussian mixture models (GMM) are powerful parametric tools with many applications in machine learning and computer vision. Expectation maximization (EM) is the most popular algorithm for estimating the GMM parameters. However, EM guarantees only convergence to a stationary point of the log-likelihood function, which could be arbitrarily worse than the optimal solution. Inspired by the relation...
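The mechanism behind the sliced distance can be sketched generically: project both sample sets onto random directions and use the closed-form 1-D Wasserstein distance (optimal transport between sorted samples). This is a hypothetical Monte-Carlo estimator assuming equal sample sizes, not the paper's algorithm.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    # Monte-Carlo estimate of the sliced 2-Wasserstein distance between two
    # equally sized point clouds in R^d: project onto random unit directions,
    # then match sorted 1-D projections (the 1-D optimal transport plan).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)
```

Applied to samples drawn from two GMMs, this gives a differentiable-in-parameters loss surrogate that avoids the intractability of the exact Wasserstein distance in high dimensions.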
Boosting GMM and Its Two Applications
Boosting is an efficient method to improve classification performance. Recent theoretical work has shown that the boosting technique can be viewed as a gradient descent search for a good fit in function space. Several authors have applied this viewpoint to density estimation problems. In this paper we generalize such a framework to a specific density model, the Gaussian Mixture Model (...
Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property
Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
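For context, the classical Polak-Ribiere-Polyak direction can be sketched as below. This is a baseline PRP+ variant (beta clipped at zero, with a steepest-descent restart safeguard), not the paper's modified three-term methods, which enforce the sufficient descent condition by construction; the function names and line-search constants are illustrative.

```python
import numpy as np

def prp_plus(f, grad, x0, tol=1e-8, max_iter=500):
    # Polak-Ribiere-Polyak conjugate gradient with the PRP+ clip
    # and a backtracking Armijo line search.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking from a unit step along the descent direction d.
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP beta, clipped at zero (PRP+); the clip acts as an automatic restart.
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        # Safeguard: fall back to steepest descent if d is not a descent direction.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = 0.5 xᵀAx − bᵀx the iterates converge to the solution of Ax = b, which makes for a convenient sanity check.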
Discriminative Training of GMM for Language Identification
In this paper, a discriminative training procedure for a Gaussian Mixture Model (GMM) language identification system is described. The proposal is based on the Generalized Probabilistic Descent (GPD) algorithm and a Minimum Classification Error formulation used to estimate the GMM parameters. The evaluation is conducted using the OGI multi-language telephone speech corpus. The experimental result...
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
Journal title:
Volume/Issue:
Pages: -
Publication date: 2011